A Vector Space for Distributional Semantics for Entailment

Authors

  • James Henderson
  • Diana Nicoleta Popa
Abstract

Distributional semantics creates vector-space representations that capture many forms of semantic similarity, but their relation to semantic entailment has been less clear. We propose a vector-space model which provides a formal foundation for a distributional semantics of entailment. Using a mean-field approximation, we develop approximate inference procedures and entailment operators over vectors of probabilities of features being known (versus unknown). We use this framework to reinterpret an existing distributional-semantic model (Word2Vec) as approximating an entailment-based model of the distributions of words in contexts, thereby predicting lexical entailment relations. In both unsupervised and semi-supervised experiments on hyponymy detection, we get substantial improvements over previous results.
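The "entailment operators over vectors of probabilities" mentioned in the abstract can be illustrated with a small sketch. The idea is that each vector dimension is a feature that is either known or unknown, and entailment y => x fails exactly when some feature is known in x but unknown in y. The specific log-factorised operator and the toy vectors below are an illustrative reconstruction under that reading of the abstract, not the paper's reference implementation:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def entailment_score(y, x):
    """Approximate log P(y => x) for log-odds vectors y and x.

    sigmoid(x[i]) is the probability that feature i is known in x.
    Entailment fails when feature i is known in x (prob sigmoid(x[i]))
    but unknown in y (prob sigmoid(-y[i])), so under a mean-field-style
    factorisation the log-probability of entailment is a sum of
    per-feature log terms.
    """
    return sum(math.log(1.0 - sigmoid(xi) * sigmoid(-yi))
               for xi, yi in zip(x, y))

# Toy example: y confidently knows every feature, x knows only the
# first, so y => x should score near 0 (near-certain entailment) while
# the reverse direction x => y should score strongly negative.
y = [5.0, 5.0, 5.0]
x = [5.0, -5.0, -5.0]
print(entailment_score(y, x))  # close to 0
print(entailment_score(x, y))  # strongly negative
```

The asymmetry of the score is the point: unlike cosine similarity, this operator distinguishes "dog entails animal" from "animal entails dog".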


Similar papers

Entailment above the word level in distributional semantics

We introduce two ways to detect entailment using distributional semantic representations of phrases. Our first experiment shows that the entailment relation between adjective-noun constructions and their head nouns (big cat |= cat), once represented as semantic vector pairs, generalizes to lexical entailment among nouns (dog |= animal). Our second experiment shows that a classifier fed semantic...

CDSMs for Semantic Relatedness and Entailment

Distributional Semantics Models (DSMs) have become widely accepted as successful models for lexical semantics. However their extension to handling larger structural units such as entire sentences remains challenging. Compositional DSMs (CDSMs) aim to successfully model sentence semantics by taking into account grammatical structure and logical words, which are ignored by simpler models. We expl...

Learning Word Embeddings for Hyponymy with Entailment-Based Distributional Semantics

Lexical entailment, such as hyponymy, is a fundamental issue in the semantics of natural language. This paper proposes distributional semantic models which efficiently learn word embeddings for entailment, using a recently-proposed framework for modelling entailment in a vector space. These models postulate a latent vector for a pseudo-phrase containing two neighbouring word vectors. We investig...

A Context-theoretic Framework for Compositionality in Distributional Semantics

Formalizing “meaning as context” mathematically leads to a new, algebraic theory of meaning, in which composition is bilinear and associative. These properties are shared by other methods that have been proposed in the literature, including the tensor product, vector addition, pointwise multiplication, and matrix multiplication. Entailment can be represented by a vector lattice ordering, inspir...

Riesz Logic

We introduce Riesz Logic, whose models are abelian lattice ordered groups, which generalise Riesz spaces (vector lattices), and show soundness and completeness. Our motivation is to provide a logic for distributional semantics of natural language, where words are typically represented as elements of a vector space whose dimensions correspond to contexts in which words may occur. This basis prov...


Journal:
  • CoRR

Volume: abs/1607.03780  Issue: -

Pages: -

Publication date: 2016